dc6a7e655d7e5840e66733e9ee67cc69-AuthorFeedback.pdf

Neural Information Processing Systems

We thank all the reviewers for their helpful suggestions and will incorporate the following analysis into our revision. First, we found four typical attention patterns shared by both XLNet and BERT, as shown in Figure 1 (rows and columns represent queries and keys, respectively).


Exposing Attention Glitches with Flip-Flop Language Modeling

Neural Information Processing Systems

This simple generative task requires a model to copy binary symbols over long-range dependencies, ignoring the tokens in between. We find that Transformer FFLMs suffer from a long tail of sporadic reasoning errors, some of which we can eliminate using various regularization techniques.
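To make the task concrete, here is a minimal sketch of a flip-flop language generator and a reference checker. It assumes the vocabulary described for flip-flop language modeling: write (`w`), read (`r`), and ignore (`i`) instructions, each followed by a bit, where every read must reproduce the most recently written bit. The function names and sampling details are illustrative, not the authors' implementation.

```python
import random

def generate_flip_flop(n_ops, seed=0):
    """Generate a flip-flop string: each operation is write (w),
    read (r), or ignore (i), followed by a bit. Reads echo the
    most recently written bit; ignores carry distractor bits."""
    rng = random.Random(seed)
    tokens = []
    last_written = None
    for _ in range(n_ops):
        # A read is only valid after at least one write.
        ops = ["w", "i"] if last_written is None else ["w", "r", "i"]
        op = rng.choice(ops)
        if op == "w":
            bit = rng.choice("01")
            last_written = bit
        elif op == "r":
            bit = last_written  # the long-range dependency the model must track
        else:
            bit = rng.choice("01")  # distractor the model should ignore
        tokens += [op, bit]
    return tokens

def check_reads(tokens):
    """Reference flip-flop automaton: True iff every read bit
    equals the bit of the most recent write."""
    last = None
    for op, bit in zip(tokens[::2], tokens[1::2]):
        if op == "w":
            last = bit
        elif op == "r" and bit != last:
            return False
    return True
```

A Transformer trained on such strings is scored on whether its prediction after each `r` matches the last written bit; `check_reads` is the exact rule the model must internalize.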

